Deterministic global optimization is a branch of numerical optimization which focuses on finding the global solutions of an optimization problem whilst providing theoretical guarantees that the reported solution is indeed the global one, within some predefined tolerance. The term deterministic global optimization typically refers to complete or rigorous (see below) optimization methods. Rigorous methods converge to the global optimum in finite time. Deterministic global optimization methods are typically used when locating the global solution is a necessity (i.e. when the only naturally occurring state described by a mathematical model is the global minimum of an optimization problem), when it is extremely difficult to find a feasible solution, or simply when the user desires to locate the best possible solution to a problem.

== Overview ==

Neumaier〔Complete Search in Continuous Global Optimization and Constraint Satisfaction, Acta Numerica 2004 (A. Iserles, ed.), Cambridge University Press, 2004〕 classified global optimization methods into four categories, depending on the degree of rigour with which they approach the optimum, as follows:
* An incomplete method uses clever intuitive heuristics for searching but has no safeguards if the search gets stuck in a local minimum.
* An asymptotically complete method reaches a global minimum with certainty, or at least with probability one, if allowed to run indefinitely long, but has no means of knowing when a global minimizer has been found.
* A complete method reaches a global minimum with certainty, assuming exact computations and indefinitely long run time, and knows after a finite time that an approximate global minimizer has been found (to within prescribed tolerances).
* A rigorous method reaches a global minimum with certainty and within given tolerances even in the presence of rounding errors, except in near-degenerate cases, where the tolerances may be exceeded.

Deterministic global optimization methods typically belong to the last two categories. Note that building a rigorous piece of software is extremely difficult, as the process requires that all of its dependencies are also implemented rigorously.

Deterministic global optimization methods require ways to rigorously bound function values over regions of space. One could say that a main difference between deterministic and non-deterministic methods in this context is that the former perform calculations over regions of the solution space, whereas the latter perform calculations on single points. Bounding is achieved either by exploiting particular functional forms (e.g. McCormick relaxations〔Computability of global solutions to factorable nonconvex programs: Part I – Convex underestimating problems, Mathematical Programming, 1976, 10(1), 147–175〕) or by using interval analysis in order to work with more general functional forms. In either case bounding is required, which is why deterministic global optimization methods cannot give a rigorous result when working with black-box code, unless that code is explicitly written to also return function bounds. For this reason, it is common for problems in deterministic global optimization to be represented using a computational graph, as it is straightforward to overload all operators such that the resulting function values or derivatives yield interval (rather than scalar) results, as illustrated in the sketch below.
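The following is a minimal sketch, not part of the original article, of how such operator overloading can work in practice: a toy interval type overloads addition, subtraction and multiplication so that an objective written as ordinary code returns a guaranteed enclosure when evaluated on a box, and a naive one-dimensional branch-and-bound loop uses those enclosures to prune regions that cannot contain the global minimum. The objective f(x) = x² − 2x + 1, the function name minimize and the tolerance are illustrative choices, not taken from the source.

<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        other = _as_interval(other)
        return Interval(self.lo + other.lo, self.hi + other.hi)

    __radd__ = __add__

    def __sub__(self, other):
        other = _as_interval(other)
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        other = _as_interval(other)
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    __rmul__ = __mul__

def _as_interval(x):
    return x if isinstance(x, Interval) else Interval(x, x)

def f(x):
    # Written once; returns a float on floats and a guaranteed enclosure on Intervals.
    return x * x - 2 * x + 1

# Enclosure of f over the box [0, 3]; it contains the true range [0, 4],
# although interval arithmetic may overestimate it (the "dependency problem").
print(f(Interval(0.0, 3.0)))

def minimize(lo, hi, tol=1e-6):
    """Naive interval branch-and-bound for a one-dimensional objective."""
    best_x = (lo + hi) / 2
    best_val = f(best_x)                    # any point evaluation is a valid upper bound
    boxes = [(lo, hi)]
    while boxes:
        a, b = boxes.pop()
        if f(Interval(a, b)).lo > best_val:
            continue                        # rigorous lower bound exceeds incumbent: prune
        mid = (a + b) / 2
        if f(mid) < best_val:
            best_val, best_x = f(mid), mid  # improve the incumbent
        if b - a > tol:
            boxes += [(a, mid), (mid, b)]   # subdivide and keep searching
    return best_x, best_val

print(minimize(0.0, 3.0))                   # approaches the global minimizer x = 1, value 0
</syntaxhighlight>

Production solvers replace this naive enclosure with outward-rounded interval arithmetic or tighter convex relaxations such as McCormick envelopes, but the overall structure, rigorous bounds on regions combined with pruning, is the same.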